Semi-supervised Learning with Ladder Networks

Authors

  • Antti Rasmus
  • Mathias Berglund
  • Mikko Honkala
  • Harri Valpola
  • Tapani Raiko
Abstract

We combine supervised learning with unsupervised learning in deep neural networks. The proposed model is trained to simultaneously minimize the sum of supervised and unsupervised cost functions by backpropagation, avoiding the need for layer-wise pre-training. Our work builds on top of the Ladder network proposed by Valpola [1], which we extend by combining the model with supervision. We show that the resulting model reaches state-of-the-art performance in semi-supervised MNIST and CIFAR-10 classification, in addition to permutation-invariant MNIST classification with all labels.
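The abstract's central idea is a single objective: the supervised cross-entropy on labeled examples plus a weighted unsupervised denoising (reconstruction) cost, all minimized jointly by backpropagation. The following is a minimal, illustrative sketch of how such a combined cost could be computed; the function names and the weight `lam` are hypothetical and not taken from the paper, and the real Ladder network sums denoising costs over every layer of a deep encoder/decoder.

```python
import math

def supervised_cost(probs, label):
    """Cross-entropy for one labeled example (probs: predicted class
    probabilities, label: index of the true class)."""
    return -math.log(probs[label] + 1e-12)

def denoising_cost(clean, reconstructed):
    """Squared error between clean activations and the decoder's
    reconstruction from a noise-corrupted encoder pass."""
    return sum((c - r) ** 2 for c, r in zip(clean, reconstructed))

def total_cost(labeled, unlabeled, lam=0.5):
    """Single scalar minimized by backprop: the supervised term
    (labeled data only) plus lam times the unsupervised denoising
    term (labeled and unlabeled data alike). `lam` is an assumed
    weighting hyperparameter, not a value from the paper."""
    cost = 0.0
    for probs, label, clean, recon in labeled:
        cost += supervised_cost(probs, label) + lam * denoising_cost(clean, recon)
    for clean, recon in unlabeled:
        cost += lam * denoising_cost(clean, recon)
    return cost

# Toy batch: one labeled and two unlabeled examples.
labeled = [([0.7, 0.3], 0, [1.0, 2.0], [0.9, 2.1])]
unlabeled = [([0.5, 0.5], [0.4, 0.6]), ([1.5, -1.0], [1.4, -0.8])]
print(round(total_cost(labeled, unlabeled), 4))  # -> 0.4017
```

Because unlabeled examples still contribute through the denoising term, the same gradient step that fits the labels also shapes the representation on unlabeled data, which is what removes the need for a separate pre-training phase.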


Similar Articles

Semi-supervised Phoneme Recognition with Recurrent Ladder Networks

Ladder networks are a notable recent development in semi-supervised learning, showing state-of-the-art results in image recognition tasks while being compatible with many existing neural architectures. We present the recurrent ladder network, a novel modification of the ladder network, for semi-supervised learning of recurrent neural networks, which we evaluate with a phoneme recognition...


Progressive Ladder Networks for Semi-Supervised Transfer Learning

Semi-supervised learning has achieved remarkable success in the past few years at harnessing the power of unlabeled data and tackling domains where few labeled data examples exist. We test the hypothesis that deep semi-supervised architectures learn general representations. We combine two well-known techniques for semi-supervised and transfer learning, ladder networks and progressive neural netw...


Virtual Adversarial Ladder Networks For Semi-supervised Learning

Semi-supervised learning (SSL) partially circumvents the high cost of labelling data by augmenting a small labeled dataset with a large and relatively cheap unlabeled dataset drawn from the same distribution. This paper offers a novel interpretation of two deep learning-based SSL approaches, ladder networks and virtual adversarial training (VAT), as applying distributional smoothing to their re...


Recurrent Ladder Networks

We propose a recurrent extension of the Ladder networks [22] whose structure is motivated by the inference required in hierarchical latent variable models. We demonstrate that the recurrent Ladder is able to handle a wide variety of complex learning tasks that benefit from iterative inference and temporal modeling. The architecture shows close-to-optimal results on temporal modeling of video da...


Adversarial Ladder Networks

The use of unsupervised data in addition to supervised data has led to a significant improvement when training discriminative neural networks. However, the best results were achieved with a training process that is divided into two parts: first an unsupervised pre-training step is done to initialize the weights of the network, and afterwards these weights are refined with the use of supervised data...



Publication date: 2015